
    "Job Quality, Labor Market Segmentation, and Earning Inequality: Effects of Economic Restructuring in the 1980s by Race and Gender"

    The authors examine the effects of employment restructuring in the 1980s on white, black, and Hispanic men and women within a labor market segmentation framework. Cluster analysis is used to determine whether jobs can be grouped into a small number of relatively homogeneous clusters on the basis of differences in job quality. With data centered on 1979, 621 occupation/industry cells covering 94% of the workforce are analyzed with 17 measures of job quality, ranging from earnings and benefits to skill requirements and working conditions. The paper finds strong support for dual and tripartite schemes that closely resemble those described, but never satisfactorily verified, by the segmented labor market (SLM) literature of the 1970s: the "primary" (independent and subordinate) and "secondary" segments. The findings also show, however, that each of these three large segments consists of two distinct and easily interpretable job clusters that differ significantly from one another in race and gender composition. The job structure became more bifurcated in the 1980s, as "middle-class" jobs (the subordinate primary segment) declined sharply and the workforce was increasingly employed in either the best (independent primary) or the worst (secondary) jobs. White women became much more concentrated at the top, while white men and black and Hispanic women were redistributed to both ends of the job structure. Black and Hispanic men, however, increased their presence only in the two secondary job clusters. Meanwhile, the quality of secondary jobs declined considerably, at least as measured by earnings, benefits, union coverage, and involuntary part-time employment. Consistent with these results, the paper finds that earnings differentials by cluster, controlling for education and experience, increased in the 1980s. The male-female wage gap also increased, as did the portion of these widening differentials accounted for by changes in the distribution of racial groups among clusters.
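The clustering step described in this abstract can be illustrated with a minimal k-means sketch. The "job cells", their quality scores, and the choice of three clusters below are invented for illustration; the paper's actual data (621 cells, 17 quality measures) and clustering procedure are not reproduced here.

```python
def kmeans(points, init_centers, iters=20):
    """Plain Lloyd's algorithm: group points into clusters by alternating
    nearest-center assignment and centroid updates."""
    centers = list(init_centers)
    k = len(centers)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            # assign each point to its nearest center (squared Euclidean distance)
            i = min(range(k),
                    key=lambda c: sum((a - b) ** 2 for a, b in zip(p, centers[c])))
            clusters[i].append(p)
        # move each center to the mean of its cluster (keep it if the cluster is empty)
        centers = [tuple(sum(xs) / len(cl) for xs in zip(*cl)) if cl else centers[i]
                   for i, cl in enumerate(clusters)]
    return centers, clusters

# Toy "job cells" scored on (earnings, benefits) -- invented numbers forming
# three well-separated quality tiers.
cells = [(9.0, 8.5), (8.7, 9.1), (3.1, 2.8), (2.9, 3.3), (5.5, 5.0), (5.8, 5.2)]
centers, clusters = kmeans(cells, init_centers=[cells[0], cells[2], cells[4]])
print([len(c) for c in clusters])  # -> [2, 2, 2]: one cluster per quality tier
```

With well-separated tiers and one initial center per tier, the algorithm converges to the three quality groups after a single pass.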

    Handling congestion in crowd motion modeling

    We address here the issue of congestion in the modeling of crowd motion, in a non-smooth framework: contacts between people are not anticipated and avoided; they actually occur and are explicitly taken into account in the model. We limit our approach to very basic behavioral principles in order to focus on the particular problems raised by the non-smooth character of the models. We consider that individuals tend to move according to a desired, or spontaneous, velocity. We account for congestion by assuming that the evolution realizes at each time an instantaneous balance between individual tendencies and global constraints (overlapping is forbidden): the actual velocity is defined as the closest to the desired velocity, in a least-squares sense, among all admissible ones. We develop those principles in both the microscopic and macroscopic settings, and we show how the framework of the Wasserstein distance between measures allows one to recover the sweeping-process nature of the problem at the macroscopic level, which makes it possible to obtain existence results in spite of the non-smooth character of the evolution process. The microscopic and macroscopic approaches are compared, and we investigate their similarities together with the deep differences between these two levels of description.
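The least-squares selection of the actual velocity can be sketched in the simplest microscopic case: two discs of radius r on a line, with overlap forbidden. When the discs are in contact, admissible velocities must satisfy u2 - u1 >= 0, and the Euclidean projection of the desired velocity onto that half-space has a closed form. This toy setup is an illustration of the principle, not the paper's implementation.

```python
def actual_velocity(q1, q2, u1, u2, r, eps=1e-12):
    """Project the desired velocity (u1, u2) onto the admissible set.
    Two discs of radius r at positions q1 <= q2 on a line: overlap is
    forbidden, so when the discs touch (q2 - q1 == 2r) the relative
    velocity u2 - u1 must be nonnegative.  The least-squares projection
    onto {u2 - u1 >= 0} replaces both components by their average when
    the constraint is active and the desired velocity violates it."""
    in_contact = (q2 - q1) <= 2 * r + eps
    if in_contact and u2 - u1 < 0:
        m = 0.5 * (u1 + u2)
        return m, m
    return u1, u2  # constraint inactive or satisfied: desired velocity is admissible

# Two touching discs pushed toward each other: they must move together.
print(actual_velocity(0.0, 2.0, 1.0, -1.0, r=1.0))  # -> (0.0, 0.0)
```

The averaging rule is exactly the minimizer of (v1 - u1)^2 + (v2 - u2)^2 subject to v2 >= v1 when the unconstrained optimum violates the constraint, which is the "closest admissible velocity" principle of the abstract in one dimension.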

    A Commercially Reasonable Sale under Article 9: Commercial, Reasonable, and Fair to All Involved


    Following red blood cells in a pulmonary capillary

    Red blood cells, or erythrocytes, are biconcave cells consisting mostly of a membrane enclosing a cytosol with a high concentration of hemoglobin. This membrane is highly deformable and allows the cells to pass through narrow vessels such as capillaries, whose diameters can be much smaller than that of a red blood cell. They carry oxygen thanks to hemoglobin, a complex molecule with a very high affinity for oxygen. The capacity of erythrocytes to load and unload oxygen is thus a determining factor in their efficacy. In this paper, we focus on the pulmonary capillary, where red blood cells capture oxygen. We propose a camera method to numerically study the behavior of a red blood cell along a whole capillary. Our goal is to understand how the geometrical changes an erythrocyte undergoes along the capillary affect its capacity to capture oxygen. The first part of this document presents the model chosen for the red blood cells, along with the numerical method used to determine and follow their shapes along the capillary. The membrane of the red blood cell is complex and is modeled with a hyperelastic approach based on Mills et al. (2004). The camera method is then validated and compared with a standard ALE method. Some geometrical properties of the red blood cells observed in our simulations are then studied and discussed. The second part of this paper deals with the modeling of oxygen and hemoglobin chemistry in the geometries obtained in the first part. We implement a full allosteric model of hemoglobin behavior inspired by Czerlinski et al. (1999).
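The paper implements a full allosteric model of hemoglobin from Czerlinski et al. (1999); as a much simpler stand-in, the cooperative oxygen binding it captures is often summarized by the classical Hill equation, sketched below with typical textbook parameter values for adult human hemoglobin (not values taken from the paper).

```python
def hill_saturation(pO2, p50=26.0, n=2.7):
    """Fraction of hemoglobin binding sites occupied by oxygen (Hill equation).
    p50: partial pressure of O2 (mmHg) at half saturation; n: Hill coefficient
    reflecting cooperative binding.  Both defaults are typical textbook values."""
    return pO2 ** n / (p50 ** n + pO2 ** n)

print(hill_saturation(26.0))         # half saturation exactly at p50
print(hill_saturation(100.0) > 0.95) # near-full saturation at arterial pO2
```

The sigmoidal shape produced by n > 1 is what makes hemoglobin load oxygen efficiently in the lungs yet release it readily in tissues; the allosteric model in the paper refines this by tracking the individual binding states.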

    Generalized Wasserstein distance and its application to transport equations with source

    In this article, we generalize the Wasserstein distance to measures with different masses. We study the properties of this distance; in particular, we show that it metrizes weak convergence for tight sequences. We use this generalized Wasserstein distance to study a transport equation with a source, in which both the vector field and the source depend on the measure itself. We prove existence and uniqueness of the solution to the Cauchy problem when the vector field and the source are Lipschitz continuous with respect to the generalized Wasserstein distance.
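The generalization in this paper is needed precisely because the classical Wasserstein distance is only defined between measures of equal total mass. The sketch below computes just that classical baseline, the W1 distance between two equal-mass empirical measures on the real line, where the optimal transport plan simply pairs sorted samples; handling unequal masses, as the paper does, additionally prices the creation or destruction of mass.

```python
def wasserstein1_1d(xs, ys):
    """W1 distance between two empirical measures made of equally many,
    equally weighted atoms on the real line.  In 1D the optimal transport
    plan matches the sorted samples, so W1 is the mean absolute difference
    of the order statistics."""
    assert len(xs) == len(ys), "classical W1 requires equal total mass"
    xs, ys = sorted(xs), sorted(ys)
    return sum(abs(a - b) for a, b in zip(xs, ys)) / len(xs)

# Shifting every atom by 1 costs exactly 1 unit of transport per unit mass.
print(wasserstein1_1d([0.0, 1.0], [1.0, 2.0]))  # -> 1.0
```

When the two measures have different masses, no such coupling exists, which is why the paper's distance augments transport cost with a cost for adding or removing mass.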

    The Breaks in per Capita Productivity Trends in a Number of Industrial Countries

    The purpose of this article is to study the trends in per capita productivity in several major industrialised countries. The analysis is first based on annual data over a long period spanning the entire 20th century for the United States, France and the United Kingdom. Productivity trends are then studied over a shorter period, using quarterly data, for the United States, France, the United Kingdom, Germany, Spain, Japan and the Netherlands. There are already a large number of studies of this kind, but they too often focus on presenting average productivity growth rates for periods chosen on an ad hoc basis. In this article, we use a robust statistical method to endogenously identify possible breaks in per capita productivity trends. This method, developed by Bai and Perron (1998), brings out the following salient features:
    – In the United States, per capita productivity growth accelerated following the trend break at the start of the 1920s, then slowed down at the end of the 1960s. This finding is in line with the "Big Wave" concept developed by Gordon (1999, 2002) to describe the trends in US productivity growth throughout the 20th century.
    – French and UK productivity started catching up with that of the United States around the end of the Second World War.
    – Most of the countries under review recorded slower trend productivity growth in the first half of the 1970s. In the United States, this break occurred in 1966, a finding that differs from other existing analyses, which point to 1974.
    – Trend productivity growth in Europe and Japan slowed in the 1990s, whereas US productivity gained momentum over the same period.
    Keywords: productivity trends; structural breaks; Bai and Perron method
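The Bai-Perron (1998) method handles multiple endogenous breaks with formal tests; its core idea in the simplest case, a single break in the mean located by minimizing the total sum of squared residuals, can be sketched as follows. The series below is invented toy data, not the productivity series analyzed in the article.

```python
def best_break(y, trim=2):
    """Return the break date that minimizes the total sum of squared
    residuals when the series mean is allowed to differ before and after
    the break -- the single-break special case of the Bai-Perron idea.
    `trim` keeps a minimum number of observations in each regime."""
    def ssr(seg):
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)
    candidates = range(trim, len(y) - trim)
    return min(candidates, key=lambda t: ssr(y[:t]) + ssr(y[t:]))

# Toy series: the mean shifts from about 1.0 to about 3.0 at index 10.
series = [1.0, 1.2, 0.9, 1.1, 1.0, 0.8, 1.1, 1.0, 0.9, 1.2,
          3.0, 3.1, 2.9, 3.2, 3.0, 2.8, 3.1, 3.0, 2.9, 3.1]
print(best_break(series))  # -> 10
```

The full procedure extends this search to several breaks with dynamic programming and adds tests for the number of breaks, which is what lets the article date trend changes endogenously rather than imposing periods in advance.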